
    Standard method use in contemporary IS development: an empirical investigation

    Purpose – The purpose of this research is to obtain an updated assessment of the use of standard methods in IS development practice in New Zealand, and to compare these practices to those reported elsewhere. Design/methodology/approach – A web-based survey of IS development practices in New Zealand organisations with 200 or more full-time employees was conducted. The results of the survey were compared to prior studies from other national contexts. Findings – The results suggest that levels of standard method use continue to be high in New Zealand organisations, although methods are often used in a pragmatic or ad hoc way. Further, the type of method used reflects a shift from bespoke development to system acquisition or outsourcing. Organisations that reported using standard methods perceived them to be beneficial to IS development in their recent IS projects, and generally disagreed with most of the published limitations of standard methods. Research limitations/implications – As the intent was to consider only New Zealand organisations, the results of the survey cannot be generalised further afield. More comparative research is needed to establish whether the trends identified here occur at a wider regional or international level. Practical implications – A significant proportion of organisations anticipated extending their use of standard methods. Growth in packaged software acquisition and outsourced development suggests an increasing need for deployment management as well as development management, possibly reflecting the increased visibility of standard project management methods. Originality/value – The relevance of traditional standard methods of IS development has been questioned in a changing and more dynamic IS development environment. This study provides an updated assessment of standard method use in New Zealand organisations that will be of interest to researchers and practitioners monitoring IS development and acquisition elsewhere.

    Information Systems and Assemblages

    The theme for the 2014 IFIP WG 8.2 working conference was ‘Information Systems and Global Assemblages: (Re)Configuring Actors, Artefacts, Organizations’. The motivation behind the choice of the conference theme has been the increasing appreciation of notions of emergence, heterogeneity and temporality in IS studies. We found that the conference provided an opportune occasion for inviting scholars interested in exploring these notions, their relevance and promise for IS studies. The concept of the ‘assemblage’ [1], already referenced in IS studies, as will be discussed below, and with significant popularity in other fields, such as anthropology, geography and cultural studies, provided the stepping stone for approaching the heterogeneous, emergent and situated nature of information systems and organization. In particular, we opted for highlighting the ‘global assemblage’ [2] as a metaphor to talk about challenging yet often creative tensions that emerge as global imperatives (geographical, intellectual, procedural and others) interact with local arrangements of actors, artefacts and organizations. Here ‘global’ does not mean universal or everywhere, but mobile, abstractable, and capable of recontextualization across diverse social and cultural situations. This book provides a collection of contributions by scholars who responded to our invitation, adding depth and breadth to our understanding of the concept and its value for IS studies. At the same time, some contributors chose to discuss emergence, heterogeneity and situatedness in different terms, drawing upon alternative theoretical traditions and concepts. The result has been an engaging and stimulating mix of ideas that points towards the ‘multiple’ trajectories, current and future, of this exciting stream of research.

    Real estate investment and urban density: Exploring the polycentric urban region using a topological lens

    Focusing on commercial office real estate as both a manifestation of and a conduit of cross-border capital flows, this paper refers to the concepts of topology and topography in a theoretical and empirical exploration of contemporary ‘network economy’ spatial implications for the ‘polycentric urban region’ (PUR). A body of research has cast doubt on the normative European representation of the multi-centre PUR as a balanced, sustainable spatial development model. Yet, the model has continued to be propagated in European territorial strategy and has been influential internationally. Academic perspectives and qualitative evidence reviewed in the paper shed light on mutual dependencies and recursive relations between network economy global structural processes, international office real estate investment practices mediated by city governments, and the spatial configuration of density. Commercial investment and city planning actor practices chime with urban agglomeration, spatial concentration and density. Quantitative evidence of associations between urban density and office real estate investment returns and capital flows is found. It is concluded that network economy topology, politics and the city are in a dialectical relationship with the PUR territorial governance agenda for spatially balanced regional development.

    Supporting smart urban growth: successful investing in density

    This report analyses the characteristics of ‘good density’ and begins to quantify the relationship between these characteristics, investor returns, and carbon emissions. We found that cities with good density – that is, dense development thoughtfully designed to promote a high quality of life – are likely to be more resilient and prosperous in the long term, and therefore more likely to provide sustainable returns for investors, than cities without good density. Based on a quantitative analysis of 63 global cities, the report finds that cities with good density are associated with higher returns, capital values, and levels of investment flows for commercial real estate. The research provides evidence of important issues for the long-term resilience of cities both in the OECD and in fast-growing developing regions.
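The density–returns association the report describes is, at its core, a bivariate analysis across cities. As a minimal sketch of that kind of calculation (the function name and the sample figures below are invented for illustration, not the report's data or method), a Pearson correlation can be computed from first principles:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Illustrative only: hypothetical density scores and annual returns (%)
# for five cities, showing a positive association.
density = [4.1, 6.3, 2.8, 7.9, 5.5]
returns = [5.2, 6.8, 4.1, 7.5, 6.0]
r = pearson_r(density, returns)
```

A full study would of course control for confounders (market size, sector mix) rather than rely on a raw correlation; this sketch only illustrates the direction-of-association claim.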

    N-terminal pro-B-type natriuretic peptide and the prediction of primary cardiovascular events: results from 15-year follow-up of WOSCOPS

    <b>Aims:</b> To test whether N-terminal pro-B-type natriuretic peptide (NT-proBNP) was independently associated with, and improved the prediction of, cardiovascular disease (CVD) in a primary prevention cohort. <b>Methods and results:</b> In the West of Scotland Coronary Prevention Study (WOSCOPS), a cohort of middle-aged men with hypercholesterolaemia at a moderate risk of CVD, we related the baseline NT-proBNP (geometric mean 28 pg/mL) in 4801 men to the risk of CVD over 15 years during which 1690 experienced CVD events. Taking into account the competing risk of non-CVD death, NT-proBNP was associated with an increased risk of all CVD [HR: 1.17 (95% CI: 1.11–1.23) per standard deviation increase in log NT-proBNP] after adjustment for classical and clinical cardiovascular risk factors plus C-reactive protein. N-terminal pro-B-type natriuretic peptide was more strongly related to the risk of fatal [HR: 1.34 (95% CI: 1.19–1.52)] than non-fatal CVD [HR: 1.17 (95% CI: 1.10–1.24)] (P = 0.022). The addition of NT-proBNP to traditional risk factors improved the C-index (+0.013; P < 0.001). The continuous net reclassification index improved with the addition of NT-proBNP by 19.8% (95% CI: 13.6–25.9%) compared with 9.8% (95% CI: 4.2–15.6%) with the addition of C-reactive protein. N-terminal pro-B-type natriuretic peptide correctly reclassified 14.7% of events, whereas C-reactive protein correctly reclassified 3.4% of events. Results were similar in the 4128 men without evidence of angina, nitrate prescription, minor ECG abnormalities, or prior cerebrovascular disease. <b>Conclusion:</b> N-terminal pro-B-type natriuretic peptide predicts CVD events in men without clinical evidence of CHD, angina, or history of stroke, and appears related more strongly to the risk for fatal events. N-terminal pro-B-type natriuretic peptide also provides moderate risk discrimination, in excess of that provided by the measurement of C-reactive protein.
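The continuous net reclassification index reported above compares predicted risks from models with and without the biomarker: events should move up in predicted risk, non-events down. A minimal sketch of the standard Pencina-style calculation (this is not the WOSCOPS analysis code; the function name and toy numbers are illustrative):

```python
def continuous_nri(old_risk, new_risk, events):
    """Continuous NRI: net proportion of events whose predicted risk rises
    plus net proportion of non-events whose predicted risk falls."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, ev in zip(old_risk, new_risk, events):
        if ev:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Toy example: the new model raises predicted risk for both events and
# lowers it for both non-events, giving the maximum NRI of 2.0.
nri = continuous_nri([0.10, 0.20, 0.30, 0.40],
                     [0.15, 0.12, 0.35, 0.30],
                     [1, 0, 1, 0])
```

In practice the two risk vectors would come from fitted survival models (with and without NT-proBNP), and confidence intervals would be obtained by bootstrap.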

    Cellular adaptations to hypoxia and acidosis during somatic evolution of breast cancer

    Conceptual models of carcinogenesis typically consist of an evolutionary sequence of heritable changes in genes controlling proliferation, apoptosis, and senescence. We propose that these steps are necessary but not sufficient to produce invasive breast cancer because intraductal tumour growth is also constrained by hypoxia and acidosis that develop as cells proliferate into the lumen and away from the underlying vessels. This requires evolution of glycolytic and acid-resistant phenotypes that, we hypothesise, is critical for emergence of invasive cancer. Mathematical models demonstrate severe hypoxia and acidosis in regions of intraductal tumours more than 100 μm from the basement membrane. Subsequent evolution of glycolytic and acid-resistant phenotypes leads to invasive proliferation. Multicellular spheroids recapitulating ductal carcinoma in situ (DCIS) microenvironmental conditions demonstrate upregulated glucose transporter 1 (GLUT-1) as adaptation to hypoxia followed by growth into normoxic regions in qualitative agreement with model predictions. Clinical specimens of DCIS exhibit periluminal distribution of GLUT-1 and Na+/H+ exchanger (NHE) indicating transcriptional activation by hypoxia and clusters of the same phenotype in the peripheral, presumably normoxic regions similar to the pattern predicted by the models and observed in spheroids. Upregulated GLUT-1 and NHE-1 were observed in microinvasive foci and adjacent intraductal cells. Adaptation to hypoxia and acidosis may represent key events in transition from in situ to invasive cancer.
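The diffusion-limited gradients such intraductal models describe can be sketched in one dimension: with constant oxygen consumption rate q and diffusivity D, the steady state of D c″ = q, with fixed concentration at the basement membrane (x = 0) and zero flux at the duct centre (x = L), has a closed form. This is a hedged, simplified illustration of the generic diffusion-consumption mechanism, not the cited models themselves; the function name, normalisation, and parameter values are assumptions:

```python
def oxygen_profile(x, c0=1.0, L=200.0, k=None):
    """Steady-state 1D diffusion-consumption profile for D c'' = q.
    x:  distance from the basement membrane (micrometres)
    c0: normalised oxygen concentration at the membrane (x = 0)
    L:  half-width of the duct lumen, with zero flux at x = L
    k:  q/D; default chosen so oxygen is exhausted exactly at x = L.
    Solution: c(x) = c0 - k * (L*x - x**2/2)."""
    if k is None:
        k = 2.0 * c0 / L ** 2
    return c0 - k * (L * x - x ** 2 / 2.0)

# With these illustrative parameters, oxygen falls to 25% of its
# peri-membrane value 100 micrometres into the lumen, consistent with
# severe hypoxia developing beyond roughly 100 micrometres.
c_100 = oxygen_profile(100.0)
```

The quadratic fall-off follows directly from the boundary conditions: integrating D c″ = q once gives the flux, which must vanish at x = L, and integrating again gives the profile above.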

    Daidalos Security Framework for Mobile Services

    Mobility is now the central focus of the lives of European citizens in business, education, and leisure. This will be enriched by pervasiveness in the future. The Daidalos vision is to seamlessly integrate heterogeneous network technologies that allow network operators and service providers to offer new and profitable services, giving users access to a wide range of pervasive, personalised voice, data, and multimedia services. This paper discusses the security issues that need to be addressed to make Daidalos a viable solution for future pervasive mobility. These issues include, among others, privacy and identity management, secure protocols, distributed key management, and security in ad hoc networks.

    Why is it difficult to implement e-health initiatives? A qualitative study

    <b>Background</b> The use of information and communication technologies in healthcare is seen as essential for high quality and cost-effective healthcare. However, implementation of e-health initiatives has often been problematic, with many failing to demonstrate predicted benefits. This study aimed to explore and understand the experiences of implementers - the senior managers and other staff charged with implementing e-health initiatives and their assessment of factors which promote or inhibit the successful implementation, embedding, and integration of e-health initiatives.<p></p> <b>Methods</b> We used a case study methodology, using semi-structured interviews with implementers for data collection. Case studies were selected to provide a range of healthcare contexts (primary, secondary, community care), e-health initiatives, and degrees of normalization. The initiatives studied were Picture Archiving and Communication System (PACS) in secondary care, a Community Nurse Information System (CNIS) in community care, and Choose and Book (C&B) across the primary-secondary care interface. Implementers were selected to provide a range of seniority, including chief executive officers, middle managers, and staff with 'on the ground' experience. Interview data were analyzed using a framework derived from Normalization Process Theory (NPT).<p></p> <b>Results</b> Twenty-three interviews were completed across the three case studies. There were wide differences in experiences of implementation and embedding across these case studies; these differences were well explained by collective action components of NPT. New technology was most likely to 'normalize' where implementers perceived that it had a positive impact on interactions between professionals and patients and between different professional groups, and fit well with the organisational goals and skill sets of existing staff. 
However, where implementers perceived problems in one or more of these areas, they also perceived a lower level of normalization.<p></p> <b>Conclusions</b> Implementers had rich understandings of barriers and facilitators to successful implementation of e-health initiatives, and their views should continue to be sought in future research. NPT can be used to explain observed variations in implementation processes, and may be useful in drawing planners' attention to potential problems with a view to addressing them during implementation planning.